'Design Dialogues II'

24.03.2020


How aware are we of our environment?

By this, I don't mean the natural environment or living organisms, but the physical and spatial context that surrounds us. We were taught that there are five senses (hearing, sight, smell, taste, and touch), each carrying a different impression to our brain.

Many studies point out that humans have more senses than that; it all depends on how we classify or divide our sensory systems.



Here, I point out a few papers and books that I found interesting. Rudolf Steiner1 and Albert Soesman2 argued that there are 12 senses, and that by breaking down the ordinary senses we can find interconnections with others, like the senses of thought, speech, imagination, equilibrium, and warmth, among others. Bruce Durie3 also broke down the classification and categorized the senses according to different modalities. Constance Classen4 and Asifa Majid5 focused on how perceiving is related to the culture and time we belong to. Crétien van Campen6 focused on synesthesia, and finally David Eagleman7 on sensory substitution: teaching and understanding through a different sense.

Across all of their studies, there is a prevalent pattern: perception is what matters, because our experiences will always be multi-layered.

We are visual creatures, and the world is mostly built for seeing. So what if I focus on the blind and visually impaired to learn other ways we can play with our senses? I decided to concentrate on touch and hearing. From the beginning, I realized that the project was heading toward a synesthetic experience. It is essential to train and alter our way of perceiving stimuli, then activate our memory, and lastly learn and act according to the new perceptions.



I joined Fundación ONCE, the Spanish national organization for the blind and visually impaired, as a volunteer, and started working mainly with two ladies. Since entering the organization, I have realized how much time it takes for them to perform a simple task: something we do in a minute might take them double or triple the time, and their memory plays a very important role. Even though we may think they are quite autonomous, there is a huge need for extra help. Also important is how technologies adapt to their needs; I feel they are still far from being efficient.



As a concept, I chose the mixing of technology and the senses to gain a new perspective on, or understanding of, art (as I'm very passionate about it). I researched how museums are dealing with this. Most of them make 3D models or replicas, which is entirely understandable, because schools teach the blind and visually impaired to rely mainly on touch. A few are trying smelling features and raised images. Yet I believe there has to be another way to approach art.



First intervention

After this first intervention, I realized machine learning was not going to be as useful as I thought for generating the descriptions, as I don't have a solid database and would need months or years to make it reliable. Also, touch, even though it is powerful, needed more exploration than just 3D printing, perhaps using vibration or temperature sensors. But I decided to take another path: another way to translate visual art.



Second intervention

I found the outcomes of the second intervention pretty great, and I want to continue with this melodic code. I think I'll build a small gadget in Fab Academy that helps you read an artwork through sounds: you scan it and then play it. I know this is very exploratory, so I now see this project more as a tool for educational purposes rather than something only for the visually impaired.
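To make the "melodic code" idea concrete, here is a minimal sketch of how scanning an image row could become a melody. It assumes a simple convention of my own (not something from the interventions): a pixel's hue picks a frequency between 220 Hz and 880 Hz, so reading a row of pixels left to right produces a sequence of pitches.

```python
import colorsys

def hue_to_frequency(r, g, b, f_min=220.0, f_max=880.0):
    """Map a pixel's color onto a pitch: the hue (position on the
    color wheel, 0-1) picks a frequency between f_min and f_max."""
    h, _, _ = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return f_min + h * (f_max - f_min)

def scan_row(pixels):
    """Turn one row of (r, g, b) pixels into a sequence of
    frequencies: a 'melody' that reads the image left to right."""
    return [hue_to_frequency(r, g, b) for r, g, b in pixels]
```

Under this mapping, pure red (hue 0) sounds as 220 Hz and pure green (hue 1/3) as 440 Hz, so a user could, with practice, learn to hear which colors a painting contains. The actual gadget would feed these frequencies to a speaker or tone generator.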

After this intervention, I talked to Mariana Quintero, our tutor, and she helped me realize that this could also be a tool. If we can memorize sounds and relate them to colors in our minds, would it be possible to link sounds directly to hand gestures or movements? What if, as we move our hands, a computer captures the gestures and colors a canvas? Can we define shapes by the way we move?

Up to this point, I had been trying to translate visual things into other senses; now I am starting to wonder whether we can make visual images out of other senses, specifically self-movement and hearing. Until now, I have been exploring the different possibilities of touch and hearing for making art approachable to the blind, the visually impaired, or any standard user who is open to understanding visual things through another sensory organ. Just as artists work with new mediums, such is the intent of this research: mediums can suggest multiple approaches and possibilities for evolution.

At home, I have been able to explore existing technologies and the opportunities of using sensors and machine learning. Imagine we train our brains to memorize sounds, similar to the exercise of the second intervention relating sounds to colors, but now with a different approach: the idea is to recreate images from sounds, where frequencies are the user's brush to paint digitally, and gestures trigger those sounds, with the intent of redefining visual tools and, therefore, drawings, paintings, or even art.
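The inverse direction, frequencies as a brush, can also be sketched simply. This is my own illustrative convention, not a finished design: each gesture event carries a canvas position and a frequency, the frequency selects a hue on the color wheel, and the stroke colors that cell of a digital canvas.

```python
import colorsys

def frequency_to_rgb(freq, f_min=220.0, f_max=880.0):
    """A frequency picks a hue on the color wheel: the inverse of
    mapping colors to pitches. Returns an (r, g, b) tuple, 0-255."""
    h = max(0.0, min(1.0, (freq - f_min) / (f_max - f_min)))
    r, g, b = colorsys.hsv_to_rgb(h, 1.0, 1.0)
    return int(r * 255), int(g * 255), int(b * 255)

def paint(canvas, events):
    """Each event is (x, y, freq): a captured hand gesture places a
    colored 'brush stroke' on the canvas (a 2D list of pixels)."""
    for x, y, freq in events:
        canvas[y][x] = frequency_to_rgb(freq)
    return canvas
```

In a real setup, the (x, y) positions would come from a gesture-tracking sensor or camera, and the frequency from the sound being played or sung; here they are plain tuples so the mapping itself stays visible.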

An idea of how it could work + First try with ML.

Even so, I feel the results shown are far from what I had in mind. It is a path I'll need to keep exploring and learning. Meanwhile, during Fab Academy, I've been using sensors, which I find pretty exciting because they can be linked to other tools.



Input Devices

I'd love to print the results as raised images so that the blind and visually impaired can touch the outcome. Within these parameters, I can still explore the methodologies of production and the tools used, reinvent them, and offer a new suggestion of what a medium could be. Could it be a new perceptual technology that adapts to contemporary practices in the arts?



Next steps.



Bibliography:
1 - Steiner, Rudolf. Man’s Twelve Senses in Their Relation to Imagination, Inspiration, Intuition. Translation Stephen Briaul, 1920, www.waldorflibrary.org/images/stories/articles/twelvesenses.pdf.
2- Soesman, Albert. Our Twelve Senses: How Healthy Senses Refresh the Soul. Hawthorn Press, 1998.
3 - Durie, Bruce. “Senses Special: Doors of Perception.” New Scientist, 26 Jan. 2005, www.newscientist.com/article/mg18524841-600-senses-special-doors-of-perception/.
4 - Classen, Constance. Worlds of Sense: Exploring the Senses in History and across Cultures. Routledge, 1993.
5 - Majid, Asifa, et al. “Differential Coding of Perception in the World’s Languages.” PNAS, National Academy of Sciences, 6 Nov. 2018, www.pnas.org/content/115/45/11369.
6 - van Campen, Crétien. The Hidden Sense: Synesthesia in Art and Science. MIT Press, 2010.
7 - Eagleman, David. Sensory Substitution, eagleman.com/research/sensory-substitution.
8 - Atlas. “El Prado: Prohibido No Tocar, Estas Obras Son Para Ciegos.” 20minutos, 19 Jan. 2015, www.20minutos.es/noticia/2350985/0/cuadros-para-ciegos/obras-maestras/museo-prado/.
9 - Alba, and Davide. “Prohibido No Tocar: Museo Tiflológico, Museo Para Ciegos.” Letras A Ciegas, 11 Mar. 2019, letrasaciegas.com/museo-para-ciegos/.
10- “Accessibility.” Louvre, 2 Apr. 2020, www.louvre.fr/en/accessibility.
11 - “Accessibility.” Guggenheim, 2 Apr. 2020, www.guggenheim.org/accessibility.
12 - "Accessibility." The Metropolitan Museum of Art, www.metmuseum.org/visit/accessibility.
13 - “i-Map.” Tate, www2.tate.org.uk/imap/imap2/userguide.shtml.